Convergence Conditions for Frequency Sensitive Competitive Learning
Authors
Abstract
We present sufficient and necessary conditions for the convergence of the Frequency Sensitive Competitive Learning (FSCL) algorithm to a local equilibrium. The final phase of the FSCL convergence is analyzed by describing the process with a Fokker-Planck equation. The analysis parallels that by Ritter and Schulten for the Kohonen self-organizing feature map (KSFM) algorithm. We show that the convergence conditions involve only the learning rate and that they are the same as the conditions for weak convergence described previously. Our analysis thus broadens the class of algorithms that have been shown to have these types of convergence properties.
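The abstract does not restate the FSCL update rule itself. For reference, the following is a minimal Python sketch of the commonly cited form of FSCL (winner chosen by win-count-weighted distance, winner then moved toward the input) with a Robbins-Monro style decaying learning rate; the function name, the 1/t schedule, and the synthetic data are illustrative assumptions, not the paper's exact formulation or its stated convergence conditions.

```python
import numpy as np

def fscl_step(x, codebook, counts, lr):
    """One FSCL step: pick the winner by win-count-weighted distance,
    move it toward the input, and increment its win count."""
    # Frequency-sensitive distortion: units that have won often are penalized,
    # so under-used codewords eventually win and all units stay active.
    d = counts * np.linalg.norm(codebook - x, axis=1)
    j = int(np.argmin(d))
    codebook[j] += lr * (x - codebook[j])   # standard competitive update toward x
    counts[j] += 1
    return j

# Illustrative usage on synthetic 2-D data with a decaying learning rate
rng = np.random.default_rng(0)
data = rng.normal(size=(5000, 2))
codebook = rng.normal(size=(8, 2))
counts = np.ones(8)
for t, x in enumerate(data, start=1):
    fscl_step(x, codebook, counts, lr=1.0 / t)   # sum(lr) diverges, sum(lr**2) converges
```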
Similar Papers
Diffusion approximation of frequency sensitive competitive learning
The focus of this paper is a convergence study of the frequency sensitive competitive learning (FSCL) algorithm. We approximate the final phase of FSCL learning by a diffusion process described by the Fokker-Planck equation. Sufficient and necessary conditions are presented for the convergence of the diffusion process to a local equilibrium. The analysis parallels that by Ritter-Schulten (1988) ...
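For orientation, a diffusion approximation of this kind takes the generic one-dimensional Fokker-Planck (Kolmogorov forward) form below for the distribution of a weight w; the drift b(w) and diffusion d(w) coefficients would come from the moments of the FSCL update, and their exact expressions are not reproduced from the paper here.

```latex
% Generic one-dimensional Fokker-Planck equation for the weight distribution
% P(w,t); b(w) and d(w) stand for the drift and diffusion coefficients obtained
% from the first and second moments of the stochastic update (assumed form).
\[
  \frac{\partial P(w,t)}{\partial t}
  = -\frac{\partial}{\partial w}\bigl[\,b(w)\,P(w,t)\bigr]
  + \frac{1}{2}\,\frac{\partial^{2}}{\partial w^{2}}\bigl[\,d(w)\,P(w,t)\bigr]
\]
```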
Image Compression Using Learned Vector Quantization
This paper presents a study and implementation of still image compression using learned vector quantization. Grey-scale still images are compressed by 16:1 and transmitted at 0.5 bits per pixel, while maintaining a peak signal-to-noise ratio of 30 dB. The vector quantization is learned using Kohonen's self-organizing feature map (SOFM). While not only being representative of the training set, ...
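The quoted rate and quality figures are consistent with a standard vector-quantization configuration. The short check below assumes 8-bit grey-scale input, 4x4 blocks, and a 256-entry codebook; none of these parameters are stated in the excerpt, so it is only arithmetic showing how such numbers typically arise.

```python
# Sanity check of the quoted figures; the 4x4 block size and 256-entry codebook
# are assumed, typical choices, not values taken from the paper.
bits_per_pixel_in = 8                      # 8-bit grey-scale source
block_pixels = 4 * 4                       # assumed 4x4 VQ blocks
codebook_bits = 8                          # 256 codewords -> 8-bit index per block
bpp_out = codebook_bits / block_pixels     # = 0.5 bits per pixel
ratio = bits_per_pixel_in / bpp_out        # = 16:1 compression

# A PSNR of 30 dB corresponds to a mean squared error of about 65 on a 0-255 scale.
mse = 255**2 / 10**(30 / 10)
print(bpp_out, ratio, round(mse, 1))       # 0.5 16.0 65.0
```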
Forced Information for Information-Theoretic Competitive Learning
We have proposed a new information-theoretic approach to competitive learning [1], [2], [3], [4], [5]. The information-theoretic method is a very flexible type of competitive learning, compared with conventional competitive learning. However, some problems have been pointed out concerning the information-theoretic method, for example, slow convergence. In this paper, we propose a new computatio...
Codeword Distribution for Frequency Sensitive Competitive Learning with One Dimensional Input
We study the codeword distribution for a conscience-type competitive learning algorithm, Frequency Sensitive Competitive Learning (FSCL), using one dimensional input data. We prove that the asymptotic codeword density in the limit of a large number of codewords is given by a power law of the form Q(x) = C P(x)^α, where P(x) is the input data density and α depends on the algorithm and the form of t...
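Written out with its normalization, the power law takes the following form; the constant C is fixed by requiring Q to integrate to one, while the value of the exponent α is the paper's result and is not reproduced here.

```latex
% Power-law form of the asymptotic codeword density; C is fixed by
% normalization, and the exponent \alpha depends on the algorithm as
% described in the excerpt above.
\[
  Q(x) = C\,P(x)^{\alpha},
  \qquad
  C = \left(\int P(x)^{\alpha}\,dx\right)^{-1},
  \qquad
  \int Q(x)\,dx = 1 .
\]
```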
High-dimensional clustering using frequency sensitive competitive learning
In this paper a clustering algorithm for sparsely sampled high-dimensional feature spaces is proposed. The algorithm performs clustering by employing a distance measure that compensates for differently sized clusters. A sequential version of the algorithm is constructed in the form of a frequency sensitive competitive learning scheme. Experiments are conducted on an artificial Gaussian data set a...
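As a rough illustration of a distance measure that compensates for differently sized clusters, the sketch below weights the Euclidean distance by each cluster's win count raised to a tunable exponent; the function name, the exponent beta, and the multiplicative form are assumptions for illustration, not the paper's exact measure.

```python
import numpy as np

def compensated_assign(X, centers, counts, beta=1.0):
    """Assign each sample to the center minimizing a count-compensated distance.
    counts**beta * distance is an assumed form of the compensation, with beta
    controlling how strongly large (frequently winning) clusters are penalized."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n_samples, n_centers)
    return np.argmin((counts ** beta) * d, axis=1)

# Example: two centers, one of which has already "won" far more often
X = np.array([[0.0, 0.0], [0.9, 0.0]])
centers = np.array([[0.0, 0.0], [2.0, 0.0]])
counts = np.array([10.0, 1.0])
print(compensated_assign(X, centers, counts))  # [0 1]: the busy center loses the borderline point
```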